Orthogonal-Least-Squares Forward Selection for Parsimonious Modelling from Data
Abstract
The objective of modelling from data is not simply that the model fits the training data well. Rather, the goodness of a model is characterized by its generalization capability, interpretability and ease of knowledge extraction. All these desired properties depend crucially on the ability of the modelling process to construct appropriate parsimonious models, and a basic principle in practical nonlinear data modelling is the parsimonious principle of seeking the smallest possible model that explains the training data. There exists a vast body of work in the area of sparse modelling, and a widely adopted approach is based on linear-in-the-parameters data modelling, which includes the radial basis function network, the neurofuzzy network and all the sparse kernel modelling techniques. A well-tested strategy for parsimonious modelling from data is the orthogonal least squares (OLS) algorithm for forward selection modelling, which is capable of constructing sparse models that generalise well. This contribution continues this theme and provides a unified framework for sparse modelling from data that includes regression and classification, which belong to supervised learning, and probability density function estimation, which is an unsupervised learning problem. The OLS forward selection method based on the leave-one-out test criterion is presented within this unified data-modelling framework. Examples from regression, classification and density estimation applications are used to illustrate the effectiveness of this generic parsimonious modelling approach.
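The forward-selection loop at the heart of the OLS algorithm can be sketched as follows. This is a minimal illustration only, not the paper's implementation: the leave-one-out model-selection criterion described in the abstract is omitted (a fixed number of terms is used instead), and the function name `ols_forward_select` is invented for this sketch. At each step the candidate regressor with the largest error-reduction ratio is appended, and the remaining candidates are orthogonalised against it in modified Gram-Schmidt fashion.

```python
import numpy as np

def ols_forward_select(Phi, y, n_terms):
    """Greedy OLS forward selection over the columns of the regression
    matrix Phi (one column per candidate regressor).  Returns the indices
    of the selected regressors in the order they were chosen."""
    P = Phi.astype(float).copy()     # working copy; columns get orthogonalised
    y = np.asarray(y, dtype=float)
    yy = y @ y                       # total output energy, for the error ratio
    selected = []
    for _ in range(n_terms):
        best_j, best_err = None, 0.0
        for j in range(P.shape[1]):
            if j in selected:
                continue
            w = P[:, j]
            ww = w @ w
            if ww < 1e-12:           # candidate already spanned by the selection
                continue
            err = (w @ y) ** 2 / (ww * yy)   # error-reduction ratio of column j
            if err > best_err:
                best_j, best_err = j, err
        if best_j is None:
            break                    # no useful candidate left
        selected.append(best_j)
        w = P[:, best_j]
        ww = w @ w
        for j in range(P.shape[1]):  # orthogonalise remaining candidates
            if j not in selected:
                P[:, j] -= (w @ P[:, j]) / ww * w
    return selected
```

Because each chosen column is removed from the remaining candidates by orthogonalisation, the error-reduction ratios of later steps measure only the *additional* variance explained, which is what makes the greedy selection effective at producing sparse models.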
Similar resources
Efficient computational schemes for the orthogonal least squares algorithm
The orthogonal least squares (OLS) algorithm is an efficient implementation of the forward selection method for subset model selection. The ability to find good subset parameters with only a linearly increasing computational requirement makes this method attractive for practical implementations. In this correspondence, we examine the computational complexity of the algorithm and present a prepro...
Sparse model identification using orthogonal forward regression with basis pursuit and D-optimality - Control Theory and Applications, IEE Proceedings-
An efficient model identification algorithm for a large class of linear-in-the-parameters models is introduced that simultaneously optimises the model approximation ability, sparsity and robustness. The derived model parameters in each forward regression step are initially estimated via the orthogonal least squares (OLS), followed by being tuned with a new gradient-descent learning algorithm ba...
An Orthogonal Forward Regression Algorithm Combined with Basis Pursuit and D-optimality
A new forward regression model identification algorithm is introduced. The derived model parameters, in each forward regression step, are initially estimated via orthogonal least squares (OLS) (using the modified Gram-Schmidt procedure), followed by being tuned with a new gradient descent learning algorithm based on the basis pursuit that minimizes the norm of the parameter estimate vector. The...
Parsimonious least squares support vector regression using orthogonal forward selection with the generalised kernel model
A sparse regression modelling technique is developed using a generalised kernel model in which each kernel regressor has its individually tuned position (centre) vector and diagonal covariance matrix. An orthogonal least squares forward selection procedure is employed to append the regressors one by one. After the determination of the model structure, namely the selection of an appropriate numb...
Robust nonlinear model identification methods using forward regression
In this correspondence, new robust nonlinear model construction algorithms for a large class of linear-in-the-parameters models are introduced to enhance model robustness via combined parameter regularization and new robust structural selection criteria. In parallel to parameter regularization, we use two classes of robust model selection criteria based on either experimental design criteria tha...